Analysis of signalling pathways using the PRISM model checker
We describe a new modelling and analysis approach for signal transduction networks in the presence of incomplete data. We illustrate the approach with an example, the RKIP-inhibited ERK pathway [1]. Our models are based on high-level descriptions of continuous-time Markov chains: reactions are modelled as synchronous processes and concentrations are modelled by discrete, abstract quantities. The main advantage of our approach is that, using a (continuous-time) stochastic logic and the PRISM model checker, we can perform quantitative analysis of queries such as "if a concentration reaches a certain level, will it remain at that level thereafter?" We also perform standard simulations and compare our results with a traditional ordinary differential equation model. An interesting result is that, for the example pathway, only a small number of discrete data values is required to render the simulations practically indistinguishable.
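The flavour of the modelling style can be sketched in Python. Note that PRISM expresses the model in its own language and checks CSL properties exactly over the whole CTMC; the sketch below, with invented names, only samples one trajectory of a toy reaction A → B with discretised concentration levels and checks the query on that single trace:

```python
import random

def simulate_ctmc(n_levels=5, k=1.0, seed=0):
    """Sample one trajectory of a toy CTMC for the reaction A -> B,
    with the concentration of A abstracted into discrete levels."""
    rng = random.Random(seed)
    a, b, t = n_levels, 0, 0.0
    trace = [(t, a)]
    while a > 0:
        t += rng.expovariate(k * a)  # exponential holding time (CTMC)
        a, b = a - 1, b + 1          # fire the reaction once
        trace.append((t, a))
    return trace

def remains_at_or_below(trace, level):
    """Trace-level analogue of the query in the abstract: once the
    concentration reaches `level`, does it stay there or below?"""
    reached = False
    for _, a in trace:
        if a <= level:
            reached = True
        elif reached:
            return False
    return reached
```

For this monotone reaction the property holds on every trace; a probabilistic model checker would instead compute the exact probability over all paths.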
Detecting beta-amyloid aggregation from the time-resolved emission spectra
Aggregation of beta-amyloids is one of the key processes responsible for the development of Alzheimer's disease. Early molecular-level detection of beta-amyloid oligomers may help in early diagnosis and in the development of new intervention therapies. Our previous studies of changes in beta-amyloid's single-tyrosine intrinsic fluorescence response during aggregation demonstrated a four-exponential fluorescence intensity decay, and that the ratio of the pre-exponential factors indicated the extent of aggregation in the early stages of the process, before the beta-sheets are formed. Here we present a complementary approach based on time-resolved emission spectra (TRES) of amyloid's tyrosine, excited at 279 nm and fluorescent in the 240-450 nm window. TRES has been used to demonstrate structural changes occurring on the nanosecond time scale after excitation, which has significant advantages over using steady-state spectra. We demonstrate this by resolving the fluorescent species, revealing that beta-amyloid's monomers show very fast dielectric relaxation, while its oligomers display a substantial spectral shift due to dielectric relaxation, which gradually decreases as the oligomers become larger.
Computational inference in systems biology
Parameter inference in mathematical models of biological pathways, expressed as coupled ordinary differential equations (ODEs), is a challenging problem. The computational costs associated with repeatedly solving the ODEs are often high. To reduce this cost, new concepts based on gradient matching have been proposed. This paper combines current adaptive gradient matching approaches, based on Gaussian processes, with a parallel tempering scheme, and conducts a comparative evaluation with current methods used for parameter inference in ODEs.
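The core idea of gradient matching can be illustrated with a deliberately simplified sketch: instead of the Gaussian-process smoother and parallel tempering used in the paper, the code below uses plain central differences and a grid search, on the assumed toy model dx/dt = -θx. It shows how the ODE is never numerically solved — the data's estimated slopes are matched directly against the ODE right-hand side:

```python
import math

def gradient_match(ts, xs, thetas):
    """Grid-search estimate of theta in dx/dt = -theta*x by matching
    finite-difference slopes of the data to the ODE right-hand side,
    avoiding any numerical ODE solving."""
    best, best_cost = None, float("inf")
    for theta in thetas:
        cost = 0.0
        for i in range(1, len(ts) - 1):
            slope = (xs[i + 1] - xs[i - 1]) / (ts[i + 1] - ts[i - 1])  # central difference
            cost += (slope + theta * xs[i]) ** 2   # residual against -theta*x
        if cost < best_cost:
            best, best_cost = theta, cost
    return best

# synthetic data from x(t) = exp(-0.5 t), so the true theta is 0.5
ts = [0.1 * i for i in range(50)]
xs = [math.exp(-0.5 * t) for t in ts]
theta_hat = gradient_match(ts, xs, [0.1 * j for j in range(1, 21)])
```

In the paper the smoother is a Gaussian process (giving uncertainty over the slopes) and the optimisation is a tempered MCMC scheme; the finite-difference/grid-search combination here is only a stand-in for exposition.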
Interplay between distribution of live cells and growth dynamics of solid tumours
Experiments show that simple diffusion of nutrients and waste molecules is not sufficient to explain the typical multilayered structure of solid tumours, where an outer rim of proliferating cells surrounds a layer of quiescent but viable cells and a central necrotic region. These experiments challenge models of tumour growth based exclusively on diffusion. Here we propose a model of tumour growth that incorporates the volume dynamics and the distribution of cells within the viable cell rim. The model is suggested by in silico experiments and is validated using in vitro data. The results correlate with in vivo data as well, and the model can be used to support experimental and clinical oncology.
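The qualitative consequence of confining proliferation to a viable rim can be shown with a generic sketch (this is an illustrative toy, not the authors' model): if only cells within a rim of fixed thickness divide, the volume grows in proportion to the rim volume, and radial growth switches from exponential to roughly linear once the radius greatly exceeds the rim thickness:

```python
def tumour_radius(r0=0.1, rim=0.1, a=1.0, dt=0.01, steps=2000):
    """Euler integration of a toy rim-growth model: only cells within a
    viable outer rim of thickness `rim` proliferate, so the volume grows
    in proportion to the rim volume.  For radii much larger than the rim
    this yields linear, not exponential, radial growth."""
    rs = [r0]
    r = r0
    for _ in range(steps):
        inner = max(r - rim, 0.0)                      # necrotic core radius
        r += dt * a * (r**3 - inner**3) / (3 * r**2)   # dV/dt proportional to rim volume
        rs.append(r)
    return rs

radii = tumour_radius()
```

At late times the radial growth rate approaches a·rim, the linear regime observed in spheroid experiments.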
A model checking approach to the parameter estimation of biochemical pathways
Model checking has historically been an important tool for verifying models of a wide variety of systems. Typically, a model has to exhibit certain properties to be classed ‘acceptable’. In this work we use model checking in a new setting: parameter estimation. We characterise the desired behaviour of a model in a temporal logic property and alter the model to make it conform to the property (determined through model checking). We have implemented a computational system called MC2(GA), which pairs a model checker with a genetic algorithm. To drive parameter estimation, the fitness of a set of parameters in a model is the inverse of the distance between its actual behaviour and the desired behaviour. The model checker used is the simulation-based Monte Carlo Model Checker for Probabilistic Linear-time Temporal Logic with numerical constraints, MC2(PLTLc). Numerical constraints, as well as the overall probability of the behaviour expressed in temporal logic, are used to minimise the behavioural distance. We define the theory underlying our parameter estimation approach in both the stochastic and continuous worlds. We apply our approach to biochemical systems and present an illustrative example in which we estimate the kinetic rate constants in a continuous model of a signalling pathway.
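The coupling of a distance-based fitness with a genetic algorithm can be sketched as follows. This is a toy stand-in, not MC2(GA) itself: the "model" is x(t) = exp(-kt), the "temporal logic property" is reduced to the requirement that x decays to 0.5 or below by time 1, and the GA is a minimal elitist scheme:

```python
import math
import random

def behavioural_distance(k, target=0.5, t=1.0):
    """Distance of the model behaviour x(t) = exp(-k*t) from the desired
    property 'x has decayed to `target` or below by time `t`'."""
    return max(0.0, math.exp(-k * t) - target)

def estimate_rate(seed=1, pop_size=20, generations=60):
    """Toy genetic algorithm in the spirit of MC2(GA): individuals are
    candidate rate constants, and fitness is the inverse of the
    behavioural distance, so distance-zero individuals are fittest."""
    rng = random.Random(seed)
    pop = [rng.uniform(0.0, 0.3) for _ in range(pop_size)]   # start far from feasible
    for _ in range(generations):
        pop.sort(key=behavioural_distance)                   # rank by distance
        parents = pop[: pop_size // 2]                       # keep the better half
        children = [max(0.0, p + rng.gauss(0, 0.1)) for p in parents]  # mutate
        pop = parents + children                             # elitist replacement
    return min(pop, key=behavioural_distance)

k_hat = estimate_rate()
```

In the real system the distance comes from checking a PLTLc property over Monte Carlo simulations of the biochemical model, rather than from a closed-form trajectory as here.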
Isolation of two strains of West Nile virus during an outbreak in southern Russia, 1999.
From July to September 1999, a widespread outbreak of meningoencephalitis associated with West Nile virus (Flavivirus, Flaviviridae) occurred in southern Russia, with hundreds of cases and dozens of deaths. Two strains of West Nile virus isolated from patient serum and brain-tissue samples reacted in hemagglutination-inhibition and neutralization tests with patients' convalescent-phase sera and immune ascites fluid from other strains of West Nile virus.
Comparing families of dynamic causal models
Mathematical models of scientific data can be formally compared using Bayesian model evidence. Previous applications in the biological sciences have mainly focussed on model selection, in which one first selects the model with the highest evidence and then makes inferences based on the parameters of that model. This “best model” approach is very useful but can become brittle if there are a large number of models to compare, and if different subjects use different models. To overcome this shortcoming we propose the combination of two further approaches: (i) family level inference and (ii) Bayesian model averaging within families. Family level inference removes uncertainty about aspects of model structure other than the characteristic of interest. For example: What are the inputs to the system? Is processing serial or parallel? Is it linear or nonlinear? Is it mediated by a single, crucial connection? We apply Bayesian model averaging within families to provide inferences about parameters that are independent of further assumptions about model structure. We illustrate the methods using Dynamic Causal Models of brain imaging data.
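The family-level step can be sketched directly from per-model log evidences. The code below (names and the flat-prior convention are illustrative assumptions, following the common choice of a flat prior over families split evenly among each family's models) sums the model posteriors within each family:

```python
import math
from collections import defaultdict

def family_posteriors(log_evidence, family_of):
    """Family-level inference: posterior probability of each model family,
    under a flat prior over families split evenly among each family's models."""
    counts = defaultdict(int)
    for m in log_evidence:
        counts[family_of[m]] += 1
    n_families = len(counts)
    # unnormalised log posterior of each model = log evidence + log prior
    log_post = {m: le - math.log(n_families * counts[family_of[m]])
                for m, le in log_evidence.items()}
    mx = max(log_post.values())
    weights = {m: math.exp(lp - mx) for m, lp in log_post.items()}  # stable softmax
    z = sum(weights.values())
    fam = defaultdict(float)
    for m, w in weights.items():
        fam[family_of[m]] += w / z      # sum model posteriors within each family
    return dict(fam)

# illustrative: two 'serial' models vs one 'parallel' model with twice the evidence
evidences = {'m1': 0.0, 'm2': 0.0, 'm3': math.log(2.0)}
families = {'m1': 'serial', 'm2': 'serial', 'm3': 'parallel'}
post = family_posteriors(evidences, families)
```

Bayesian model averaging within the winning family would then weight each model's parameter posterior by its within-family posterior probability.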
Designing attractive models via automated identification of chaotic and oscillatory dynamical regimes
Chaos and oscillations continue to capture the interest of both the scientific community and the general public. Yet despite the importance of these qualitative features, most attempts at constructing mathematical models of such phenomena have taken an indirect, quantitative approach, for example, by fitting models to a finite number of data points. Here we develop a qualitative inference framework that allows us both to reverse-engineer and to design systems exhibiting these and other dynamical behaviours by directly specifying the desired characteristics of the underlying dynamical attractor. This change in perspective, from quantitative to qualitative dynamics, provides fundamental new insights into the properties of dynamical systems.
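One standard way to identify the dynamical regime automatically is a Lyapunov exponent: positive for chaos, negative for stable periodic behaviour. The sketch below (an illustrative diagnostic, not the paper's inference framework) estimates it for the logistic map, whose exponent at r = 4 is known to be ln 2:

```python
import math

def lyapunov_logistic(r, x0=0.3, burn=200, n=10000):
    """Estimate the largest Lyapunov exponent of the logistic map
    x -> r*x*(1-x) as the orbit average of log|f'(x)| = log|r*(1-2x)|:
    positive indicates chaos, negative a stable periodic regime."""
    x = x0
    for _ in range(burn):            # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n
```

At r = 3.2 the map settles onto a stable 2-cycle (negative exponent); at r = 4 it is fully chaotic (exponent ln 2 ≈ 0.693).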
A framework for parameter estimation and model selection from experimental data in systems biology using approximate Bayesian computation.
As modeling becomes a more widespread practice in the life sciences and biomedical sciences, researchers need reliable tools to calibrate models against ever more complex and detailed data. Here we present an approximate Bayesian computation (ABC) framework and software environment, ABC-SysBio, a Python package that runs on Linux and Mac OS X and enables parameter estimation and model selection in the Bayesian formalism using sequential Monte Carlo (SMC) approaches. We outline the underlying rationale, discuss the computational and practical issues and provide detailed guidance as to how the important tasks of parameter inference and model selection can be performed in practice. Unlike other available packages, ABC-SysBio is particularly well suited to the challenging problem of fitting stochastic models to data. In order to demonstrate the use of ABC-SysBio, in this protocol we postulate the existence of an imaginary reaction network composed of seven interrelated biological reactions (involving a specific mRNA, the protein it encodes and a post-translationally modified version of the protein), a network that is defined by two files containing 'observed' data that we provide as supplementary information. In the first part of the PROCEDURE, ABC-SysBio is used to infer the parameters of this system, whereas in the second part we use ABC-SysBio's relevant functionality to discriminate between two different reaction network models, one of them being the 'true' one. Although computationally expensive, the additional insights gained in the Bayesian formalism more than make up for this cost, especially in complex problems.
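ABC-SysBio implements ABC-SMC, with sequential populations and a decreasing tolerance schedule; the sketch below shows only the simplest rejection variant of ABC, on a made-up toy model (none of these names come from the package's API), to convey the core mechanism of likelihood-free inference:

```python
import random
import statistics

def abc_rejection(observed, simulate, prior_sample, eps=0.1, n=20000, seed=0):
    """Minimal ABC rejection sampler: keep prior draws whose simulated
    summary statistic lands within eps of the observed one."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed) < eps:
            accepted.append(theta)
    return accepted

def simulate(theta, rng):
    """Toy model: the summary statistic is the mean of 20 Normal(theta, 1) draws."""
    return statistics.fmean([rng.gauss(theta, 1) for _ in range(20)])

posterior = abc_rejection(observed=2.0, simulate=simulate,
                          prior_sample=lambda rng: rng.uniform(0, 5))
```

The accepted draws approximate the posterior over theta; ABC-SMC improves on this by reusing accepted particles across populations while tightening eps, which is what makes stochastic biochemical models tractable.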
A novel cost function to estimate parameters of oscillatory biochemical systems
Oscillatory pathways are among the most important classes of biochemical systems, with examples ranging from circadian rhythms to cell-cycle maintenance. Mathematical modeling of these highly interconnected biochemical networks is needed to meet numerous objectives, such as investigating, predicting and controlling their dynamics. Identifying the kinetic rate parameters is essential for fully modeling these and other biological processes. These kinetic parameters, however, are not usually available from measurements, and most of them have to be estimated by parameter-fitting techniques. One of the issues with estimating kinetic parameters in oscillatory systems is the irregularity of the least-squares (LS) cost-function surface used to estimate them, which is caused by the periodicity of the measurements. These irregularities result in numerous local minima, which limit the performance of even some of the most robust global optimization algorithms. We propose a parameter estimation framework that addresses these issues by integrating temporal information with the periodic information embedded in the measurements. This periodic information is used to build a cost function with better surface properties, leading to fewer local minima and better performance of global optimization algorithms. We verify, for three oscillatory biochemical systems, that the proposed cost function results in an increased ability to estimate accurate kinetic parameters compared to the traditional LS cost function. We combine this cost function with an improved noise-removal approach that leverages periodic characteristics embedded in the measurements to effectively reduce noise. The results provide strong evidence of the efficacy of this noise-removal approach over previously used wavelet hard-thresholding noise-removal methods. The proposed optimization framework results in more accurate kinetic parameters that will eventually lead to biochemical models that are more precise, predictable and controllable.
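Why periodicity ruins a pointwise LS cost, and how periodic information helps, can be shown with a small sketch (an illustrative construction, not the paper's cost function): a simulation with the correct oscillation period but the wrong phase incurs a large LS cost, while a cost built on the estimated period ignores the phase and only penalises genuinely wrong rates:

```python
import math

def ls_cost(sim, obs):
    """Pointwise least-squares cost: heavily penalises phase shifts."""
    return sum((s - o) ** 2 for s, o in zip(sim, obs))

def period_cost(sim, obs, dt):
    """Toy period-aware cost: compare dominant periods estimated from the
    spacing of upward zero crossings, ignoring phase entirely."""
    def period(xs):
        ups = [i for i in range(1, len(xs)) if xs[i - 1] < 0 <= xs[i]]
        gaps = [b - a for a, b in zip(ups, ups[1:])]
        return dt * sum(gaps) / len(gaps)
    return (period(sim) - period(obs)) ** 2

dt = 0.01
ts = [dt * i for i in range(1000)]
obs = [math.sin(2 * math.pi * t) for t in ts]                    # "measured" oscillation
shifted = [math.sin(2 * math.pi * t + math.pi / 2) for t in ts]  # right rate, wrong phase
detuned = [math.sin(2 * math.pi * 1.3 * t) for t in ts]          # wrong rate
```

In an LS landscape the phase-shifted parameters sit in a spurious local minimum far from the data; a period-aware term flattens such artefacts of periodicity.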